Hive delimiters


2024-06-12 22:20 | Source: compiled from the web

# Change the delimiter to a comma (,)

ALTER TABLE table_name SET SERDEPROPERTIES ('field.delim' = ',' , 'serialization.format'=',');

# Change the delimiter to \001, which vim on Linux displays as ^A; this is Hive's default delimiter

ALTER TABLE table_name SET SERDEPROPERTIES ('field.delim' = '\001' , 'serialization.format'='\001');

# Change the delimiter to the tab character \t

ALTER TABLE table_name SET SERDEPROPERTIES ('field.delim' = '\t' , 'serialization.format'='\t');
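To make the three settings concrete, here is a small sketch of what one record looks like on disk under each delimiter. This is illustration only, in Python rather than Hive, and the row values are made up:

```python
# Illustrative only (hypothetical row values): one record as laid out
# in the data file under each of the three delimiters set above.
row = ["1", "SN-001", "alice"]

comma_line = ",".join(row)    # field.delim = ','
soh_line = "\001".join(row)   # field.delim = '\001' (shows as ^A in vim)
tab_line = "\t".join(row)     # field.delim = '\t'

print(repr(comma_line))
print(repr(tab_line))
print(soh_line.split("\001"))  # splitting recovers the original fields
```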

Key points:

- field.delim specifies the field delimiter used between two column fields in the table's data files.
- serialization.format specifies the field delimiter written between two column fields when data is serialized to file.
- For a partitioned table, each partition can carry its own delimiter properties. After changing a partitioned table's delimiter with ALTER, reads and writes of existing partition data are unaffected; the change only applies to data written afterwards. This behavior is very convenient.
- In other words, the ALTER syntax only takes effect for newly added data. For example, suppose a partitioned table has two partitions, day=2020-05-01 and day=2020-05-02, with \t as the delimiter. After changing the delimiter to \001 via ALTER, a new partition day=2020-05-03 is written.
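The per-partition behavior can be sketched as follows. This is a Python stand-in with made-up data, not Hive internals: the point is that each partition keeps the delimiter it was written with, so a reader must consult that partition's own field.delim, not only the table-level one:

```python
# Sketch with hypothetical data: each partition retains the delimiter
# it was written with, so reads use the partition's own field.delim.
partition_delim = {
    "day=2020-05-01": "\t",     # written before the ALTER
    "day=2020-05-02": "\t",
    "day=2020-05-03": "\001",   # written after the ALTER to \001
}

partition_file = {
    "day=2020-05-01": "1\tSN-001\talice",
    "day=2020-05-03": "3\001SN-003\001carol",
}

def read_partition(part):
    # Hive resolves the delimiter per partition in much the same way
    return partition_file[part].split(partition_delim[part])

print(read_partition("day=2020-05-01"))
print(read_partition("day=2020-05-03"))
```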

You can inspect each partition's delimiter with the desc formatted tablename partition(key=value) syntax. The 2020-05-01 and 2020-05-02 partitions still use the \t delimiter, while the 2020-05-03 partition uses \001; Hive can read and write all three partitions normally without any problem.

Checking the table itself with desc formatted tablename shows that its delimiter has changed to \001.
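Note that desc formatted renders \001 as \u0001, and these escape spellings are easy to mix up. Python's escape rules happen to match the distinction drawn here, so the byte values can be checked directly (this illustrates the escapes, it is not Hive code):

```python
# \01, \001, and \u0001 all denote the same byte, 0x01 (SOH / ^A).
assert "\01" == "\001" == "\u0001" == chr(1)

# \0001 is NOT the same thing: the octal escape stops after three
# digits, so it is the NUL byte followed by a literal "1".
assert "\0001" == chr(0) + "1"
print("escape check passed")
```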

When the delimiter given to Sqoop's --fields-terminated-by parameter changes, you must also update both field.delim and serialization.format on the corresponding Hive table, using the ALTER syntax above. Specifying \01 or \001 to --fields-terminated-by has the same effect: both map to \001 in Hive's field.delim and serialization.format. Hive's default delimiter is \001, which appears as \u0001 in desc formatted output; do not write it as \01 or \0001.

Steps:

1. Create a partitioned table, specifying \t as the delimiter:

CREATE TABLE `tmp.test0506_sqoop`(
  `id` bigint,
  `seq_no` string,
  `name` string,
  `e_type` string,
  `status` string)
PARTITIONED BY (`day` string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n'
STORED AS textfile;

2. Import data through Sqoop, specifying \t as the delimiter:

sqoop import \
  --mapreduce-job-name sqoop_table_xxx \
  --hive-drop-import-delims \
  --connect "${datasource_connect}" \
  --username ${datasource_username} \
  --password '${datasource_password}' \
  --hive-overwrite \
  --hive-import \
  --split-by id \
  --boundary-query 'select min(id),max(id) from xxx' \
  --hive-table tmp.test0506_sqoop \
  --query 'select id,seq_no,name,e_type,status from xxx where $CONDITIONS' \
  --target-dir /tmp/sqoop_test0506_sqoop_`date +%s` \
  --fields-terminated-by '\t' \
  --hive-partition-key day \
  --hive-partition-value '2020-05-01'

3. Change the table's delimiter to \001 with ALTER:

ALTER TABLE tmp.test0506_sqoop SET SERDEPROPERTIES ('field.delim' = '\001', 'serialization.format' = '\001');

4. Continue importing data with Sqoop, now specifying \001 as the delimiter:

sqoop import \
  --mapreduce-job-name sqoop_table_xxx \
  --hive-drop-import-delims \
  --connect "${datasource_connect}" \
  --username ${datasource_username} \
  --password '${datasource_password}' \
  --hive-overwrite \
  --hive-import \
  --split-by id \
  --boundary-query 'select min(id),max(id) from xxx' \
  --hive-table tmp.test0506_sqoop \
  --query 'select id,seq_no,name,e_type,status from xxx where $CONDITIONS' \
  --target-dir /tmp/sqoop_test0506_sqoop_`date +%s` \
  --fields-terminated-by '\001' \
  --hive-partition-key day \
  --hive-partition-value '2020-05-02'

5. Check the delimiters of the table and of each partition:

desc formatted tmp.test0506_sqoop;

Storage Desc Params:
  field.delim            \u0001
  line.delim             \n
  serialization.format   \u0001

desc formatted tmp.test0506_sqoop partition(day='2020-05-01');

Storage Desc Params:
  field.delim            \t
  line.delim             \n
  serialization.format   \t

desc formatted tmp.test0506_sqoop partition(day='2020-05-02');

Storage Desc Params:
  field.delim            \u0001
  line.delim             \n
  serialization.format   \u0001

6. Query the table data; it displays correctly:

select * from tmp.test0506_sqoop where day='2020-05-01' limit 2;

select * from tmp.test0506_sqoop where day='2020-05-02' limit 2;
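Both partitions read back correctly because each one is parsed with its own delimiter. Had Hive applied the new \001 delimiter to the old \t-delimited partition, the classic mismatch symptom would appear: the whole line lands in the first column and the remaining columns come back NULL. A sketch of that failure mode, with hypothetical data and Python standing in for the parse:

```python
# Mismatch sketch: a line written with \t but split on \001 yields a
# single field -- the whole row lands in column one, the rest are NULL.
line = "1\tSN-001\talice"        # written with \t
fields = line.split("\001")      # wrongly parsed with \001
print(fields)                    # one element, not three
```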

Original post: https://blog.csdn.net/zbz1006572352/article/details/105976059


